
Remove test checks for Spark versions before 3.2.0 #11316

Open · wants to merge 5 commits into base: branch-24.10
Conversation

sameerz (Collaborator) commented Aug 12, 2024

Spark versions before 3.2.0 are no longer supported. This PR removes the following test conditions:

  • is_before_spark_312
  • is_before_spark_313
  • is_before_spark_314
  • is_before_spark_320

Fixes #11160
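The removed gates all follow the same pattern: a predicate that compares the running Spark version against a cutoff. A minimal sketch of that pattern (the real helpers live in the spark-rapids integration-test utilities; the implementation below is an illustrative stand-in using version tuples, not the suite's actual code):

```python
# Hypothetical stand-in for the suite's version-gate helpers.
def is_before_spark_320(version):
    """True when the given Spark version tuple is older than 3.2.0."""
    return version < (3, 2, 0)

# Once 3.2.0 is the minimum supported version, this predicate is False for
# every version the suite can run against, so every skipif/xfail mark gated
# on it is dead code and can simply be deleted.
assert is_before_spark_320((3, 1, 3)) is True    # no longer a supported version
assert is_before_spark_320((3, 2, 0)) is False   # the new floor
```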

Sameer Raheja added 5 commits August 7, 2024 06:17
@sameerz sameerz added the test Only impacts tests label Aug 12, 2024
@@ -118,7 +105,7 @@ def test_cast_string_date_invalid_ansi(invalid):


  # test try_cast in Spark versions >= 320 and < 340
- @pytest.mark.skipif(is_before_spark_320() or is_spark_340_or_later() or is_databricks113_or_later(), reason="try_cast only in Spark 3.2+")
+ @pytest.mark.skipif(is_before_spark_340() or is_databricks113_or_later(), reason="try_cast only in Spark 3.2+")
Member:

Isn't this supposed to be is_spark_340_or_later instead of is_before_spark_340?
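The reviewer's point can be checked mechanically: try_cast tests should run on 3.2.x/3.3.x and be skipped on 3.4.0 and later, and is_before_spark_340 inverts that. A sketch with simplified stand-ins for the suite's version helpers (hypothetical implementations, tuple-based for illustration):

```python
# Hypothetical stand-ins for the suite's version helpers.
def is_before_spark_340(v):
    return v < (3, 4, 0)

def is_spark_340_or_later(v):
    return v >= (3, 4, 0)

# try_cast tests should RUN on 3.2.x/3.3.x and be SKIPPED on 3.4.0+.
should_skip = {(3, 2, 0): False, (3, 3, 0): False, (3, 4, 0): True, (3, 5, 0): True}
for v, expected in should_skip.items():
    assert is_spark_340_or_later(v) == expected      # suggested condition: matches intent
    assert is_before_spark_340(v) == (not expected)  # PR's condition: inverted
```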

Member:

2024 copyrights

@@ -224,8 +220,7 @@ def test_dpp_reuse_broadcast_exchange_cpu_scan(spark_tmp_table_factory):
@pytest.mark.parametrize('s_index', list(range(len(_statements))), ids=idfn)
@pytest.mark.parametrize('aqe_enabled', [
'false',
-     pytest.param('true', marks=pytest.mark.skipif(is_before_spark_320() and not is_databricks_runtime(),
-         reason='Only in Spark 3.2.0+ AQE and DPP can be both enabled'))
+     pytest.param('true', marks=pytest.mark.skipif(not is_databricks_runtime(), reason='AQE+DPP not supported on Databricks'))
Member:

Previous condition was AND not OR and would only trigger on Spark 3.1.x

Suggested change:

    pytest.param('true', marks=pytest.mark.skipif(not is_databricks_runtime(), reason='AQE+DPP not supported on Databricks'))
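A sketch of why the rewritten mark changes behavior: the old condition only skipped pre-3.2.0 non-Databricks runs, so with pre-3.2.0 support gone it is always false and the whole skipif can be dropped, while `not is_databricks_runtime()` skips every non-Databricks run. The helpers below are hypothetical stand-ins for the suite's environment checks:

```python
# Hypothetical stand-ins for the suite's environment helpers.
def old_condition(is_before_320, is_databricks):
    # Pre-PR: skip only on pre-3.2.0 Spark outside Databricks.
    return is_before_320 and not is_databricks

def new_condition(is_before_320, is_databricks):
    # PR's rewrite: skip on every non-Databricks run.
    return not is_databricks

# With pre-3.2.0 removed (is_before_320 is always False), the old condition
# never skips; the rewritten one now skips all non-Databricks runs.
for is_databricks in (True, False):
    assert old_condition(False, is_databricks) is False
assert new_condition(False, False) is True  # behavior change: now skipped
```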

Comment on lines +247 to +248:

    pytest.param('true', marks=pytest.mark.skipif(not is_databricks_runtime(),
        reason='AQE+DPP not supported on Databricks'))
Member:

Suggested change:

    pytest.param('true', marks=pytest.mark.skipif(not is_databricks_runtime(),
        reason='AQE+DPP not supported on Databricks'))

Comment on lines +269 to +270:

    pytest.param('true', marks=pytest.mark.skipif(not is_databricks_runtime(),
        reason='AQE+DPP not supported on Databricks'))
Member:

Suggested change:

    pytest.param('true', marks=pytest.mark.skipif(not is_databricks_runtime(),
        reason='AQE+DPP not supported on Databricks'))

Comment on lines -786 to -787:

    # special constant values
    pytest.param("\"" + optional_whitespace_regex + "(now|today|tomorrow|epoch)" + optional_whitespace_regex + "\"", marks=pytest.mark.xfail(condition=is_before_spark_320(), reason="https://github.com/NVIDIA/spark-rapids/issues/9724")),
Member:

This parameter should not have been deleted, just the xfail marker removed, e.g.:

    # special constant values
    "\"" + optional_whitespace_regex + "(now|today|tomorrow|epoch)" + optional_whitespace_regex + "\"",

@@ -192,8 +192,7 @@ def query_map_scalar(spark):


  @allow_non_gpu('WindowLocalExec')
- @datagen_overrides(seed=0, condition=is_before_spark_314()
-     or (not is_before_spark_320() and is_before_spark_323())
+ @datagen_overrides(seed=0, condition=is_before_spark_322()
Member:

Suggested change:

    - @datagen_overrides(seed=0, condition=is_before_spark_322()
    + @datagen_overrides(seed=0, condition=is_before_spark_323()
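The suggestion can be verified by enumerating versions: on the supported range (>= 3.2.0) the original compound condition reduces to is_before_spark_323, not is_before_spark_322. A sketch with hypothetical tuple-based stand-ins for the version helpers:

```python
def before(v, bound):
    return v < bound

def original(v):    # pre-PR condition
    return before(v, (3, 1, 4)) or (not before(v, (3, 2, 0)) and before(v, (3, 2, 3)))

def pr_version(v):  # what the PR wrote: is_before_spark_322()
    return before(v, (3, 2, 2))

def suggested(v):   # reviewer's suggestion: is_before_spark_323()
    return before(v, (3, 2, 3))

# On every supported version the suggestion preserves the old behavior,
# while the PR's condition silently drops 3.2.2 from the gate.
for v in [(3, 2, 0), (3, 2, 1), (3, 2, 2), (3, 2, 3), (3, 3, 0), (3, 4, 0)]:
    assert original(v) == suggested(v)
assert pr_version((3, 2, 2)) != original((3, 2, 2))
```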

Member:

2024 copyrights

Member:

2024 copyrights

Member:

is_neg_dec_scale_bug_version should also be updated, as it's using spark_version() comparisons directly against 3.1.1 and 3.1.3.
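The helper the reviewer mentions compares version strings directly. A sketch of why a string-comparison gate pinned inside 3.1.x becomes dead code once 3.2.0 is the floor (the implementation below is illustrative, not the suite's actual code):

```python
def spark_version():
    # Stand-in: the real helper queries the active Spark session.
    return "3.2.0"

def is_neg_dec_scale_bug_version():
    # Illustrative shape of a lexicographic version-string gate on 3.1.x.
    return "3.1.1" <= spark_version() < "3.1.3"

# With 3.2.0 as the minimum supported version, a range pinned inside 3.1.x
# can never match, so this branch of the helper is unreachable.
assert is_neg_dec_scale_bug_version() is False
```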

Labels: test (Only impacts tests)
Projects: none yet

Successfully merging this pull request may close these issues:

  • Remove tests specific to Spark 3.1.x

2 participants